Learning Multiple Related Tasks using Latent Independent Component Analysis
Authors
Abstract
We propose a probabilistic model based on Independent Component Analysis for learning multiple related tasks. In our model the task parameters are assumed to be generated from independent sources which account for the relatedness of the tasks. We use Laplace distributions to model hidden sources which makes it possible to identify the hidden, independent components instead of just modeling correlations. Furthermore, our model enjoys a sparsity property which makes it both parsimonious and robust. We also propose efficient algorithms for both empirical Bayes method and point estimation. Our experimental results on two multi-label text classification data sets show that the proposed approach is promising.
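The generative assumption described above can be sketched in a few lines: task parameter vectors are produced by linearly mixing a small number of independent, Laplace-distributed hidden sources, so related tasks share structure through a common mixing matrix while the heavy-tailed prior encourages sparse source activations. This is a minimal illustrative sketch, not the paper's algorithm; the array names and dimensions are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_tasks, n_features, n_sources = 5, 20, 3

# Mixing matrix: maps the independent sources into parameter space.
# Sharing A across tasks is what models task relatedness.
A = rng.normal(size=(n_features, n_sources))

# Hidden sources, one column per task, drawn i.i.d. from a Laplace
# prior; its heavy tails yield sparse, identifiable components,
# unlike a Gaussian prior which only captures correlations.
S = rng.laplace(loc=0.0, scale=1.0, size=(n_sources, n_tasks))

# Each task's parameter vector is a mixture of the shared sources
# plus small task-specific noise.
W = A @ S + 0.01 * rng.normal(size=(n_features, n_tasks))

print(W.shape)  # one 20-dimensional parameter vector per task
```

Under this view, learning amounts to inferring `A` and the per-task sources from the observed task data, e.g. by empirical Bayes or point estimation as the abstract mentions.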
Similar Resources
The Effects of Task Orientation and Involvement Load on Learning Collocations
This study examined the effects of input-oriented and output-oriented tasks with different involvement load indices on Iranian EFL learners' comprehension and production of lexical collocations. To achieve this purpose, a sample of 180 intermediate-level EFL learners (both male and female) participated in the study. The participants were in six experimental groups. Each of the groups was random...
Fuzzy Local ICA for Extracting Independent Components Related to External Criteria
Independent component analysis (ICA) is an unsupervised technique for blind source separation, and the ICA algorithms using nongaussianity as the measure of mutual independence have been also used for projection pursuit or visualization of multivariate data for knowledge discovery in databases (KDD). However, in real applications, it is often the case that we fail to extract useful latent varia...
Differential Learning Algorithms for Decorrelation and Independent Component Analysis
Decorrelation and its higher-order generalization, independent component analysis (ICA), are fundamental and important tasks in unsupervised learning, that were studied mainly in the domain of Hebbian learning. In this paper we present a variation of the natural gradient ICA, differential ICA, where the learning relies on the concurrent change of output variables. We interpret the differential ...
Similarity Component Analysis
Measuring similarity is crucial to many learning tasks. To this end, metric learning has been the dominant paradigm. However, similarity is a richer and broader notion than what metrics entail. For example, similarity can arise from the process of aggregating the decisions of multiple latent components, where each latent component compares data in its own way by focusing on a different subset o...
Variational Learning in Nonlinear Gaussian Belief Networks
We view perceptual tasks such as vision and speech recognition as inference problems where the goal is to estimate the posterior distribution over latent variables (e.g., depth in stereo vision) given the sensory input. The recent flurry of research in independent component analysis exemplifies the importance of inferring the continuous-valued latent variables of input data. The latent variable...
Journal title:
Volume Issue
Pages -
Publication date: 2005